Jointly Compatible Pair Linking for Visual Tracking with Probabilistic Priors
Author
Abstract
Video sequences of real-world situations are often difficult to track with machine vision. Scenes frequently contain visual clutter, repetitive textures and occlusions that make online visual feature tracking difficult. If the camera is allowed to shake or moving objects are present, the exponential search-space of potential feature matches rapidly becomes intractable for real-time applications. In this paper we introduce Jointly Compatible Pair Linking (JCPL), an algorithm that efficiently and deterministically identifies the most globally consensual set of feature-measurement matches in tracking problems with probabilistic priors. We demonstrate JCPL as part of a two-stage visual tracking algorithm, showing it correctly resolving significant matching ambiguities in sequences with highly dynamic camera motion while robustly ignoring moving scene objects. In these experiments we show JCPL and the two-stage tracker evaluating a fixed number of tests in an exponential search-space. In one experiment JCPL tested less than 1/200th of the total search space and executed 4.6 times faster than the current gold-standard algorithm, Joint Compatibility Branch and Bound (JCBB). Given highly ambiguous sequences we show JCPL tracking successfully while standard JCBB chooses incorrect matches and fails. Throughout our experiments the number of costly image matching operations is minimised; in a typical sequence only 20.4% of the full image matching operations are required.
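The abstract does not give JCPL's internals, but the core idea it names, growing a globally consensual set of feature-measurement matches by linking mutually compatible pairs, can be illustrated with a minimal sketch. The compatibility test below (preservation of inter-feature distance) and the greedy clique growth are illustrative assumptions, not the paper's method: the actual JCPL uses probabilistic priors and is deterministic, and pairwise compatibility does not in general guarantee full joint compatibility.

```python
import math

def pairwise_compatible(m1, m2, tol=2.0):
    """Two candidate matches are treated as compatible if the distance
    between their predicted feature positions is preserved (within tol)
    by their measured positions. Illustrative geometric test only."""
    (p1, z1), (p2, z2) = m1, m2
    return abs(math.dist(p1, p2) - math.dist(z1, z2)) <= tol

def link_compatible_set(matches, tol=2.0):
    """Greedily grow the largest set of mutually compatible matches by
    linking compatible pairs -- a sketch of the 'pair linking' idea,
    not the published JCPL algorithm."""
    best = []
    for seed in matches:
        clique = [seed]
        for cand in matches:
            if cand is seed:
                continue
            # Only link a candidate that is compatible with every
            # match already in the growing consensus set.
            if all(pairwise_compatible(cand, c, tol) for c in clique):
                clique.append(cand)
        if len(clique) > len(best):
            best = clique
    return best

# Each match pairs a predicted (x, y) with a measured (x, y).
matches = [
    ((0.0, 0.0), (0.1, 0.0)),    # geometrically consistent
    ((10.0, 0.0), (10.0, 0.2)),  # geometrically consistent
    ((5.0, 5.0), (25.0, 30.0)),  # outlier: inter-feature geometry broken
]
consensus = link_compatible_set(matches)
print(len(consensus))  # -> 2: the outlier is excluded
```

The O(n^2) greedy growth above evaluates far fewer hypotheses than enumerating all 2^n match subsets, which mirrors the abstract's point that only a small, bounded fraction of the exponential search-space need be tested.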
Similar Resources
Adaptive Probabilistic Visual Tracking with Incremental Subspace Update
Visual tracking, in essence, deals with non-stationary data streams that change over time. While most existing algorithms are able to track objects well in controlled environments, they usually fail if there is a significant change in object appearance or surrounding illumination. The reason being that these visual tracking algorithms operate on the premise that the models of the objects being ...
Contextualized trajectory parsing with spatiotemporal graph.
This work investigates how to automatically parse object trajectories in surveillance videos, which aims at jointly solving three subproblems: 1) spatial segmentation, 2) temporal tracking, and 3) object categorization. We present a novel representation spatiotemporal graph (ST-Graph) in which: 1) Graph nodes express the motion primitives, each representing a short sequence of small-size patche...
Joint Headlight Pairing and Vehicle Tracking by weighted Set Packing in Nighttime Traffic Videos
We propose a Set Packing (SP) framework for joint headlight pairing and vehicle tracking. Given headlight detections, traditional nighttime vehicle tracking methods usually first pair headlights and then track these pairs. However, the poor photometric condition often introduces tremendous noises in headlight detection and pairing, which leads to unrecoverable errors for vehicle tracking. To ov...
Estimating the Parameters for Linking Unstandardized References with the Matrix Comparator
This paper discusses recent research on methods for estimating configuration parameters for the Matrix Comparator used for linking unstandardized or heterogeneously standardized references. The matrix comparator computes the aggregate similarity between the tokens (words) in a pair of references. The two most critical parameters for the matrix comparator for obtaining the best linking results a...
Probabilistic Tracking of Multiple Speakers in Meetings
Tracking speakers in multiparty conversations constitutes a fundamental task for automatic meeting analysis. In this paper, we present a probabilistic approach to jointly track the location and speaking activity of multiple speakers in a multisensor meeting room, equipped with a small microphone array and multiple uncalibrated cameras. Our framework is based on a mixed-state dynamic graphical m...
Journal title:
Volume Issue
Pages -
Publication date 2011